Dropout as a Low-Rank Regularizer for Matrix Factorization
Authors
Abstract
Regularization for matrix factorization (MF) and approximation problems has been carried out in many different ways. Owing to its popularity in deep learning, dropout has also been applied to this class of problems. Despite its solid empirical performance, the theoretical properties of dropout as a regularizer remain elusive in this setting. In this paper, we present a theoretical analysis of dropout for MF, in which Bernoulli random variables are used to drop columns of the factors. We demonstrate the equivalence between dropout and a fully deterministic model for MF in which the factors are regularized by the sum of the products of the squared Euclidean norms of corresponding columns. Additionally, we examine the case of a variable-sized factorization and prove that dropout achieves the global minimum of a convex approximation problem with (squared) nuclear-norm regularization. As a result, we conclude that dropout can be used as a low-rank regularizer with data-dependent singular-value thresholding.
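The equivalence stated in the abstract can be checked numerically. The following NumPy sketch (all matrix sizes and the keep probability theta are illustrative choices, not taken from the paper) compares a Monte Carlo estimate of the dropout objective E‖X − (1/θ) U diag(r) Vᵀ‖²_F, with r_i ~ Bernoulli(θ), against the deterministic objective ‖X − UVᵀ‖²_F + ((1−θ)/θ) Σ_i ‖u_i‖² ‖v_i‖²; the two should agree up to Monte Carlo error.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, d = 8, 6, 4      # illustrative sizes: X is m x n, factors have d columns
theta = 0.7            # Bernoulli keep probability (illustrative)

X = rng.normal(size=(m, n))
U = rng.normal(size=(m, d))
V = rng.normal(size=(n, d))

def dropout_objective(U, V, trials=50_000):
    """Monte Carlo estimate of E || X - (1/theta) U diag(r) V^T ||_F^2."""
    total = 0.0
    for _ in range(trials):
        keep = rng.random(d) < theta                 # r_i ~ Bernoulli(theta)
        approx = U[:, keep] @ V[:, keep].T / theta   # rescaled surviving columns
        total += np.linalg.norm(X - approx, "fro") ** 2
    return total / trials

def deterministic_objective(U, V):
    """|| X - U V^T ||_F^2 + (1-theta)/theta * sum_i ||u_i||^2 ||v_i||^2."""
    reg = ((1 - theta) / theta) * np.sum(np.sum(U**2, axis=0) * np.sum(V**2, axis=0))
    return np.linalg.norm(X - U @ V.T, "fro") ** 2 + reg

print(dropout_objective(U, V), deterministic_objective(U, V))  # close up to MC error
```

The identity follows because the dropout variables are independent, so the expected squared error decomposes into the bias term ‖X − UVᵀ‖²_F plus the per-column variance ((1−θ)/θ) ‖u_i‖² ‖v_i‖² of each rank-one term.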
Similar references
An Analysis of Dropout for Matrix Factorization
Dropout is a simple yet effective algorithm for regularizing neural networks by randomly dropping out units through Bernoulli multiplicative noise, and for some restricted problem classes, such as linear or logistic regression, several theoretical studies have demonstrated the equivalence between dropout and a fully deterministic optimization problem with data-dependent Tikhonov regularization....
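For intuition, the equivalence mentioned in this blurb is exact in the linear-regression case: dropping features independently with keep probability θ and rescaling by 1/θ yields, in expectation, the plain squared error plus a data-dependent Tikhonov term ((1−θ)/θ) Σ_j w_j² ‖X_{:,j}‖². A minimal NumPy check, with all sizes and θ chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 20, 5           # illustrative: n samples, p features
theta = 0.8            # keep probability (illustrative)

X = rng.normal(size=(n, p))
y = rng.normal(size=n)
w = rng.normal(size=p)

def dropout_risk(w, trials=50_000):
    """Monte Carlo estimate of E || y - (1/theta) (R * X) w ||^2, R_ij ~ Bernoulli(theta)."""
    total = 0.0
    for _ in range(trials):
        R = rng.random((n, p)) < theta       # independent feature dropout mask
        total += np.sum((y - (R * X) @ w / theta) ** 2)
    return total / trials

def tikhonov_risk(w):
    """|| y - X w ||^2 + (1-theta)/theta * sum_j w_j^2 ||X[:, j]||^2."""
    reg = ((1 - theta) / theta) * np.sum(w**2 * np.sum(X**2, axis=0))
    return np.sum((y - X @ w) ** 2) + reg

print(dropout_risk(w), tikhonov_risk(w))  # agree up to Monte Carlo error
```

For generalized linear models beyond least squares, the cited analysis establishes this only to first order, with the column norms replaced by an estimate of the inverse diagonal Fisher information.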
Smooth incomplete matrix factorization and its applications in image/video denoising
Low-rank matrix factorization with missing elements has many applications in computer vision. However, the original model without taking any prior information, which is to minimize the total reconstruction error of all the observed matrix elements, sometimes provides a physically meaningless solution in some applications. In this paper, we propose a regularized low-rank factorization model for ...
Alternating Iteratively Reweighted Minimization Algorithms for Low-Rank Matrix Factorization
Nowadays, the availability of large-scale data in disparate application domains urges the deployment of sophisticated tools for extracting valuable knowledge out of this huge bulk of information. In that vein, low-rank representations (LRRs) which seek low-dimensional embeddings of data have naturally appeared. In an effort to reduce computational complexity and improve estimation performance, ...
Rank-One Matrix Completion with Automatic Rank Estimation via L1-Norm Regularization
Completing a matrix from a small subset of its entries, i.e., matrix completion, is a challenging problem arising from many real-world applications, such as machine learning and computer vision. One popular approach to solving the matrix completion problem is based on low-rank decomposition/factorization. Low-rank matrix decomposition-based methods often require a pre-specified rank, which is d...
Dropout Training as Adaptive Regularization
Dropout and other feature noising schemes control overfitting by artificially corrupting the training data. For generalized linear models, dropout performs a form of adaptive regularization. Using this viewpoint, we show that the dropout regularizer is first-order equivalent to an L2 regularizer applied after scaling the features by an estimate of the inverse diagonal Fisher information matrix....
Journal: CoRR
Volume: abs/1710.05092
Publication date: 2017